The Diagonal Update for Unconstrained Optimization


Related Articles

Multi-steps Symmetric Rank-one Update for Unconstrained Optimization

In this paper, we present a generalized Symmetric Rank-one (SR1) method that employs interpolatory polynomials in order to obtain more accurate information from more than one previous step. The basic idea is to incorporate the SR1 update within the framework of multi-step methods. Hence, the iterates can be interpolated by a curve in such a way that the consecutive points define the curves. How...
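For context, the classical single-step SR1 update that this multi-step scheme generalizes is the standard one, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$:

```latex
B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k}
```

The multi-step variant described in the abstract replaces $s_k$ and $y_k$ with quantities derived from an interpolating curve through several previous iterates.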

Accumulative Approach in Multistep Diagonal Gradient-Type Method for Large-Scale Unconstrained Optimization

This paper focuses on developing diagonal gradient-type methods that employ an accumulative approach in multistep diagonal updating to determine a better Hessian approximation in each step. The interpolating curve is used to derive a generalization of the weak secant equation, which carries the information of the local Hessian. The new parameterization of the interpolating curve in variable spa...
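The weak secant equation referred to here is the standard scalar relaxation (due to Dennis and Wolkowicz) of the full secant condition $B_{k+1} s_k = y_k$:

```latex
s_k^{\top} B_{k+1} s_k = s_k^{\top} y_k
```

It pins down only the curvature along the step direction, which is what makes cheap diagonal approximations of the Hessian feasible.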

A New Diagonal Gradient-type Method for Large Scale Unconstrained Optimization

The main focus of this paper is to derive a new diagonal updating scheme via the direct weak secant equation. This new scheme allows us to improve the accuracy of the Hessian approximation and is also capable of utilizing information gathered about the function in previous iterations. This is followed by a scaling approach that employs a scaling parameter based upon the proposed weak secant equation to ...
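A minimal sketch of a diagonal update that enforces the weak secant equation — this is the well-known least-change (Frobenius-norm) solution, shown with hypothetical names; it is not necessarily the exact scheme proposed in the paper:

```python
def diagonal_weak_secant_update(d, s, y, eps=1e-12):
    """Least-change update of a diagonal Hessian approximation
    B = diag(d) so that the weak secant equation s^T B_new s = s^T y
    holds.  Hypothetical helper for illustration."""
    sy = sum(si * yi for si, yi in zip(s, y))          # s^T y
    sBs = sum(di * si * si for di, si in zip(d, s))    # s^T B s
    s4 = sum(si ** 4 for si in s)                      # sum of s_i^4
    if s4 < eps:                                       # safeguard: tiny step
        return list(d)
    mu = (sy - sBs) / s4                               # Lagrange multiplier
    return [di + mu * si * si for di, si in zip(d, s)]
```

For example, with d = [1, 1], s = [1, 2], y = [3, 4], the updated diagonal satisfies s^T B s = 11 = s^T y exactly.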

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
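As a point of reference, a minimal nonlinear conjugate gradient loop with a Fletcher-Reeves parameter and Armijo backtracking — an illustrative textbook sketch with hypothetical names, not the algorithm proposed in the paper:

```python
def cg_minimize(f, grad, x, steps=200, tol=1e-10):
    """Minimal nonlinear CG: Fletcher-Reeves beta, Armijo backtracking,
    and a steepest-descent restart whenever the current direction fails
    to be a descent direction.  Illustrative sketch only."""
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(steps):
        gg_old = sum(gi * gi for gi in g)
        if gg_old < tol:
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0.0:                    # safeguard: restart
            d = [-gi for gi in g]
            slope = -gg_old
        t, fx = 1.0, f(x)
        # Armijo backtracking line search
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-12:
                break
        x = [xi + t * di for xi, di in zip(x, d)]
        g = grad(x)
        beta = sum(gi * gi for gi in g) / gg_old   # Fletcher-Reeves beta
        d = [-gi + beta * di for gi, di in zip(g, d)]
    return x
```

Methods like the one in the abstract differ mainly in how the conjugate gradient parameter (here the Fletcher-Reeves beta) is chosen.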

A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
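One classical hybrid/truncated rule of this flavor is PRP+, where the Polak-Ribiere-Polyak parameter is clipped at zero; shown here as an illustration only (the paper's formula adds further modifications to guarantee sufficient descent independently of the line search):

```python
def prp_plus_direction(g_new, g_old, d_old):
    """Truncated Polak-Ribiere-Polyak (PRP+) search direction:
    beta is clipped at zero, a classical device for restoring
    convergence properties.  Illustration, not the paper's formula."""
    num = sum(gn * (gn - go) for gn, go in zip(g_new, g_old))
    den = sum(go * go for go in g_old)
    beta = max(0.0, num / den)
    return [-gn + beta * di for gn, di in zip(g_new, d_old)]
```

For example, with g_old = [1, 0], g_new = [0, 1], d_old = [-1, 0], the rule gives beta = 1 and the direction [-1, -1], which satisfies the descent condition g_new . d < 0.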


Journal

Journal title: JOURNAL OF EDUCATION AND SCIENCE

Year: 2012

ISSN: 2664-2530

DOI: 10.33899/edusj.2012.59199